# High-Precision Semantic Matching
## Bge Reranker Ft

Author: foochun · Tags: Text Embedding · Downloads: 70 · Likes: 0

A cross-encoder model fine-tuned from BAAI/bge-reranker-base for scoring text pairs, suited to text re-ranking and semantic search tasks.

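As with any sentence-transformers cross-encoder, the model scores a (query, passage) pair directly rather than producing separate embeddings. A minimal reranking sketch is shown below; the repo id `foochun/bge-reranker-ft` is assumed from this catalog entry and may differ from the actual path. The same pattern applies to the other cross-encoder rerankers in this list.

```python
from sentence_transformers import CrossEncoder

# Repo id assumed from the catalog entry; substitute the actual model path if it differs.
model = CrossEncoder("foochun/bge-reranker-ft", max_length=512)

query = "how to renew a passport online"
candidates = [
    "Step-by-step guide to renewing your passport through the official portal.",
    "Best hiking trails near the city center.",
    "Passport photo requirements for first-time applicants.",
]

# A cross-encoder scores each (query, passage) pair jointly; higher = more relevant.
scores = model.predict([(query, passage) for passage in candidates])
for passage, score in sorted(zip(candidates, scores), key=lambda x: x[1], reverse=True):
    print(f"{score:.4f}  {passage}")
```
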
## GTE ModernColBERT V1

Author: lightonai · License: Apache-2.0 · Tags: Text Embedding · Downloads: 157.96k · Likes: 98

A PyLate sentence-similarity model based on the ColBERT architecture, built on Alibaba-NLP/gte-modernbert-base and trained with a distillation loss; suited to information retrieval tasks.

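The ColBERT architecture mentioned above scores a query against a document by late interaction: every query token is compared to every document token, and the best match per query token is summed (MaxSim). The sketch below illustrates that scoring step in plain PyTorch with random stand-in token embeddings; it is not the PyLate API itself.

```python
import torch

def maxsim_score(query_emb: torch.Tensor, doc_emb: torch.Tensor) -> torch.Tensor:
    """ColBERT-style late interaction: for each query token, take its best-matching
    document token (cosine similarity), then sum over query tokens."""
    q = torch.nn.functional.normalize(query_emb, dim=-1)   # (num_query_tokens, dim)
    d = torch.nn.functional.normalize(doc_emb, dim=-1)     # (num_doc_tokens, dim)
    sim = q @ d.T                                          # token-to-token similarities
    return sim.max(dim=1).values.sum()

# Random stand-ins for the per-token embeddings a ColBERT model such as
# GTE-ModernColBERT-v1 would produce (e.g. via the PyLate library).
query_tokens = torch.randn(8, 128)
doc_a_tokens = torch.randn(120, 128)
doc_b_tokens = torch.randn(95, 128)

print("doc A:", maxsim_score(query_tokens, doc_a_tokens).item())
print("doc B:", maxsim_score(query_tokens, doc_b_tokens).item())
```
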
## Bge Large Zh V1.5 GGUF

Author: mradermacher · License: MIT · Tags: Text Embedding · Chinese · Downloads: 536 · Likes: 1

A GGUF-format release of BAAI/bge-large-zh-v1.5, a Chinese sentence-transformer model primarily used for feature extraction and sentence-similarity computation.

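Because this release ships GGUF weights, it can be used through llama.cpp rather than PyTorch. A sketch using the llama-cpp-python bindings follows; the file name and quantization level are placeholders for whichever GGUF file you download.

```python
from llama_cpp import Llama

# File name and quantization are placeholders; point model_path at the GGUF file you downloaded.
llm = Llama(model_path="bge-large-zh-v1.5.Q8_0.gguf", embedding=True, verbose=False)

sentences = ["今天天气很好", "今天天气不错"]
vectors = [llm.create_embedding(s)["data"][0]["embedding"] for s in sentences]
print(len(vectors), len(vectors[0]))  # 2 vectors; bge-large embeddings are 1024-dimensional
```
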
## Reranker ModernBERT Base Gooaq Bce

Author: akr2002 · License: Apache-2.0 · Tags: Text Embedding · English · Downloads: 16 · Likes: 1

A cross-encoder model fine-tuned from ModernBERT-base for text re-ranking and semantic search tasks.

## Ruri V3 Reranker 310m Preview

Author: cl-nagoya · License: Apache-2.0 · Tags: Text Embedding · Japanese · Downloads: 79 · Likes: 0

A preview version of a Japanese general-purpose reranking model, fine-tuned from cl-nagoya/ruri-v3-pt-310m and designed for Japanese text relevance ranking tasks.

## Muffakir Embedding

Author: mohamed2811 · Tags: Text Embedding · Arabic · Downloads: 332 · Likes: 1

An Arabic sentence transformer trained on Egyptian legal books and synthetic data, optimized for semantic text similarity and information retrieval tasks.

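Bi-encoder embedding models like this one are typically used by encoding queries and passages separately and comparing them with cosine similarity. A minimal sentence-transformers sketch follows; the repo id `mohamed2811/Muffakir_Embedding` is assumed from this catalog entry.

```python
from sentence_transformers import SentenceTransformer, util

# Repo id assumed from the catalog entry; any sentence-transformers model works the same way.
model = SentenceTransformer("mohamed2811/Muffakir_Embedding")

query = "ما هي شروط صحة العقد؟"  # "What are the conditions for a valid contract?"
passages = [
    "يشترط لصحة العقد توافر الرضا والمحل والسبب.",
    "تحدد هذه المادة مواعيد الطعن بالاستئناف.",
]

query_emb = model.encode(query, convert_to_tensor=True)
passage_embs = model.encode(passages, convert_to_tensor=True)

# Cosine similarity between the query and each passage; higher = more relevant.
print(util.cos_sim(query_emb, passage_embs))
```
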
## Stella En 400M V5 FinanceRAG V2

Author: thomaskim1130 · Tags: Large Language Model · Other · Downloads: 555 · Likes: 6

A finance-domain retrieval model for retrieval-augmented generation (RAG), fine-tuned from stella_en_400M_v5; it supports semantic retrieval and passage matching over financial documents.

## Marsilia Embeddings FR Base

Author: sujet-ai · License: MIT · Tags: Text Embedding · Transformers · French · Downloads: 31 · Likes: 4

Marsilia-Embeddings-FR-Base is a French embedding model designed for financial-domain tasks; it demonstrates the importance of task-specific fine-tuning of embedding models in Retrieval-Augmented Generation (RAG) applications.

## SFR Embedding Mistral

Author: Salesforce · Tags: Text Embedding · Transformers · English · Downloads: 34.75k · Likes: 277

A text embedding model developed by Salesforce Research, built on E5-mistral-7b-instruct and Mistral-7B-v0.1, primarily used for text retrieval tasks.

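Models in the E5-Mistral family are usually queried with an instruction-prefixed prompt and pooled on the last non-padding token. The sketch below follows that convention with plain transformers; the exact instruction wording is an assumption, not a requirement of the model.

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def last_token_pool(last_hidden_states, attention_mask):
    # Pool on the last non-padding token (handles both left- and right-padded batches).
    left_padded = attention_mask[:, -1].sum() == attention_mask.shape[0]
    if left_padded:
        return last_hidden_states[:, -1]
    seq_lengths = attention_mask.sum(dim=1) - 1
    return last_hidden_states[torch.arange(last_hidden_states.shape[0]), seq_lengths]

tokenizer = AutoTokenizer.from_pretrained("Salesforce/SFR-Embedding-Mistral")
model = AutoModel.from_pretrained("Salesforce/SFR-Embedding-Mistral", torch_dtype=torch.float16)

# Queries carry a task instruction; documents are embedded as-is. Instruction text is illustrative.
task = "Given a web search query, retrieve relevant passages that answer the query"
texts = [
    f"Instruct: {task}\nQuery: how do dense retrievers handle long documents?",
    "Long inputs are usually split into overlapping passages before they are embedded.",
]

batch = tokenizer(texts, padding=True, truncation=True, max_length=4096, return_tensors="pt")
with torch.no_grad():
    out = model(**batch)

emb = F.normalize(last_token_pool(out.last_hidden_state, batch["attention_mask"]).float(), dim=-1)
print(emb @ emb.T)  # cosine similarity between query and passage
```
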
## Rankcse Listmle Bert Base Uncased

Author: perceptiveshawty · License: Apache-2.0 · Tags: Text Embedding · Transformers · English · Downloads: 20 · Likes: 0

A sentence-embedding model based on bert-base-uncased, trained and evaluated in the SimCSE (Simple Contrastive Learning of Sentence Embeddings) framework with a listwise ranking objective; supports sentence-similarity tasks.

## Dragon Plus Query Encoder

Author: facebook · Tags: Text Embedding · Transformers · Downloads: 3,918 · Likes: 20

DRAGON+ is a dense retrieval model based on the BERT architecture, with initial weights derived from RetroMAE and trained on augmented data from the MS MARCO corpus.

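DRAGON+ is an asymmetric dual encoder: this entry is the query encoder, and passages are embedded with the companion context encoder (facebook/dragon-plus-context-encoder); relevance is the dot product of the two [CLS] embeddings. A minimal sketch:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("facebook/dragon-plus-query-encoder")
query_encoder = AutoModel.from_pretrained("facebook/dragon-plus-query-encoder")
context_encoder = AutoModel.from_pretrained("facebook/dragon-plus-context-encoder")

query = "where was marie curie born?"
contexts = [
    "Maria Sklodowska, later known as Marie Curie, was born on November 7, 1867, in Warsaw.",
    "Born in Paris on 15 May 1859, Pierre Curie was the son of Eugène Curie, a doctor.",
]

# DRAGON+ uses the [CLS] token embedding from each encoder; relevance is their dot product.
with torch.no_grad():
    q_emb = query_encoder(**tokenizer(query, return_tensors="pt")).last_hidden_state[:, 0, :]
    ctx_batch = tokenizer(contexts, padding=True, truncation=True, return_tensors="pt")
    ctx_emb = context_encoder(**ctx_batch).last_hidden_state[:, 0, :]

print(q_emb @ ctx_emb.T)  # higher score = more relevant passage
```
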